Introduction to Neural Networks: "Energy" and attractor networks; Hopfield networks

Abstract

Supervised learning. Introduced the idea of a "cost" function over weight space, and regression and learning in linear neural networks. The cost was the sum of squared differences between the network's predictions and the correct answers. The motivation was to derive a "learning rule" that adjusts (synaptic) weights to minimize the discrepancy between predictions and answers. Last time we showed four different ways to find the generating parameters {a, b} = {2, 3} for data with the following generative process:

rsurface[a_, b_] := N[Table[{x1 = 1 RandomReal[], x2 = 1 RandomReal[],
    a x1 + b x2 + 0.5 RandomReal[] - 0.25}, {120}], 2];
data = rsurface[2, 3];
Outdata = data[[All, 3]];
Indata = data[[All, 1 ;; 2]];
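For instance, the parameters can be recovered in one step by solving Indata.{a, b} == Outdata in the least-squares sense. This is a minimal sketch using the Indata and Outdata variables defined above; LeastSquares is a standard Mathematica built-in, and is not necessarily the exact form any of the four methods from last time took:

(* least-squares solve of the overdetermined linear system *)
estimate = LeastSquares[Indata, Outdata]
(* estimate should come out near the generating values {2, 3},
   up to the added uniform noise *)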


Related articles

Optical neural networks with terminal attractors for pattern recognition

Junji Ohtsubo, MEMBER SPIE, Shizuoka University, Faculty of Engineering, 3-5-1 Johoku, Hamamatsu 432, Japan. E-mail: [email protected] Abstract: A neural network system with terminal attractors is proposed for pattern recognition. By introducing terminal attractors, the spurious states of the energy function in Hopfield neural networks can be avoided and a unique solution ...
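For context, the baseline this abstract is contrasting against can be sketched as a standard Hopfield network with Hebbian storage, whose energy minima include both the stored patterns and spurious mixture states. The sketch below is the textbook construction, not Ohtsubo's terminal-attractor or optical variant, and the random patterns are purely illustrative:

n = 16;
patterns = RandomChoice[{-1, 1}, {3, n}];   (* three random ±1 memories *)
(* Hebbian outer-product weights with the diagonal zeroed *)
W = Total[Outer[Times, #, #] & /@ patterns] - 3 IdentityMatrix[n];
energy[s_] := -(1/2) s.W.s;                 (* Hopfield energy function *)
step[s_] := Module[{t = s, i = RandomInteger[{1, n}]},
  t[[i]] = If[W[[i]].t >= 0, 1, -1]; t];    (* asynchronous unit update *)
final = Nest[step, RandomChoice[{-1, 1}, n], 300];
energy[final]   (* the state settles into a minimum, possibly a spurious one *)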


Attractor Dynamics in Feedforward Neural Networks

We study the probabilistic generative models parameterized by feedforward neural networks. An attractor dynamics for probabilistic inference in these models is derived from a mean field approximation for large, layered sigmoidal networks. Fixed points of the dynamics correspond to solutions of the mean field equations, which relate the statistics of each unit to those of its Markov blanket. We ...
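The flavor of such a fixed-point computation can be illustrated with a generic mean-field iteration m ← σ(W.m + b). Note that this toy version omits the Markov-blanket (parent and child) terms of the paper's actual mean field equations; it only shows how fixed points of the iterated dynamics solve the self-consistency equations:

sigma[x_] := 1/(1 + Exp[-x]);     (* logistic squashing function *)
n = 8;
W = RandomReal[{-1, 1}, {n, n}];  (* illustrative random couplings *)
b = RandomReal[{-1, 1}, n];
(* iterate the mean-field update toward a fixed point *)
m = Nest[sigma[W.# + b] &, ConstantArray[0.5, n], 200];
Norm[m - sigma[W.m + b]]          (* small residual => near a fixed point *)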


Catastrophic Forgetting and the Pseudorehearsal Solution in Hopfield Networks

Most artificial neural networks suffer from the problem of catastrophic forgetting, where previously learnt information is suddenly and completely lost when new information is learnt. Memory in real neural systems does not appear to suffer from this unusual behaviour. In this thesis we discuss the problem of catastrophic forgetting in Hopfield networks, and investigate various potential solutio...
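A rough sketch of the pseudorehearsal idea in a Hopfield setting: pseudoitems are sampled by letting the already-trained network settle from random probe states, and those pseudoitems are then rehearsed alongside the new patterns. The one-shot Hebbian "retraining" below is a simplification for illustration; the thesis's actual training procedure may differ:

n = 16;
hebb[pats_] := Total[Outer[Times, #, #] & /@ pats] -
   Length[pats] IdentityMatrix[n];            (* Hebbian weights, zero diagonal *)
step[w_, s_] := Module[{t = s, i = RandomInteger[{1, n}]},
  t[[i]] = If[w[[i]].t >= 0, 1, -1]; t];
settle[w_, s0_] := Nest[step[w, #] &, s0, 300];
old = RandomChoice[{-1, 1}, {3, n}];          (* originally learned items *)
Wold = hebb[old];
(* pseudoitems: attractors reached from random probes of the old network *)
pseudo = Table[settle[Wold, RandomChoice[{-1, 1}, n]], {5}];
new = RandomChoice[{-1, 1}, {2, n}];
Wnew = hebb[Join[new, pseudo]];   (* new items learned together with pseudoitems *)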


Structure and Dynamics of Random Recurrent Neural Networks

In contrast with Hopfield-like networks, random recurrent neural networks (RRNN), where the couplings are random, exhibit complex dynamics (limit cycles, chaos). It is possible to store information in these networks through Hebbian learning. Eventually, learning "destroys" the dynamics and leads to a fixed-point attractor. We investigate here the structural change in the networks through l...
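The "complex dynamics before learning" part is easy to reproduce with a generic discrete-time random recurrent network: with Gaussian couplings and a large enough gain g the trajectories are irregular, while a small gain gives a fixed point. This is a standard random-network demonstration, not the specific RRNN model of the paper:

n = 50; g = 3.0;   (* gain well above 1 tends to give irregular dynamics *)
W = RandomVariate[NormalDistribution[0, 1/Sqrt[n]], {n, n}];
traj = NestList[Tanh[g W.#] &, RandomReal[{-1, 1}, n], 300];
ListLinePlot[traj[[All, 1]]]   (* one unit's trajectory over time *)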


History-Dependent Attractor Neural Networks

We present a methodological framework enabling a detailed description of the performance of Hopfield-like attractor neural networks (ANN) in the first two iterations. Using the Bayesian approach, we find that performance is improved when a history-based term is included in the neuron's dynamics. A further enhancement of the network's performance is achieved by judiciously choosing the censored ...
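One simple way to realize a history-based term is to let each neuron's field accumulate a decaying trace of its past inputs, as sketched below. The abstract's Bayesian derivation presumably prescribes a specific form, so the leak parameter lam and the update rule here are illustrative assumptions only:

n = 16; lam = 0.5;   (* lam: assumed history weight, not from the paper *)
pats = RandomChoice[{-1, 1}, {3, n}];
W = Total[Outer[Times, #, #] & /@ pats] - 3 IdentityMatrix[n];
(* field h carries a decaying trace of past inputs; state s follows its sign *)
histStep[{s_, h_}] := Module[{hNew = W.s + lam h},
  {Sign[hNew] /. 0 -> 1, hNew}];
probe = pats[[1]] Table[If[RandomReal[] < 0.2, -1, 1], {n}];  (* noisy pattern 1 *)
{sFinal, hFinal} = Nest[histStep, {probe, ConstantArray[0., n]}, 20];
HammingDistance[sFinal, pats[[1]]]   (* recovery quality after a few iterations *)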



Journal: not recorded

Publication date: 2012